Evaluating Rater and Rubric Performance on a Writing Placement Exam
Abstract
As higher education institutions pursue internationalization in response to globalization, the lingua franca status of English is driving expectations that even in countries where English is not a national language, graduate students should have the language skills to disseminate their research in English. It is within this wider context that several departments at Universidad de Los Andes (Los Andes), a well-respected private research university in Colombia, asked the Department of Languages and Socio-Cultural Studies (D-LESC) to create a program that would promote their PhD students' ability to write for publication and present at academic conferences in English. Faculty in D-LESC developed both the curriculum for the resulting Inglés para Doctorados (IPD) program and the IPD Placement Exam, which includes a reading section already in wider use at Los Andes as well as speaking and writing sections written specifically for the new program. During a pilot phase and after the IPD exam became operational, the faculty involved in test development checked its reliability and monitored how well students were being placed into IPD classes. However, the potential consequences of test use grew more extreme: shortly after the IPD program was approved, completion through the third level (IPD 3) became required for all PhD students, and some departments began to limit admissions to their PhD programs based on IPD exam results. The lead test developer therefore felt a more thorough evaluation of the exam's reliability and validity was in order. Thus in the spring of 2012 I joined the lead test developer in a comprehensive evaluation of the IPD Placement Exam. One part of this larger evaluation project involved investigating the writing section in order to address practical concerns of administrators and faculty in the IPD program: namely, whether raters and the scoring rubric were functioning effectively.
I assumed responsibility for this part of the evaluation, and the current report is a more extensive and technical presentation of findings that will be shared with IPD program stakeholders so that they can make informed decisions about whether any aspects of rater training and/or scoring materials and procedures could benefit from revision.

LITERATURE REVIEW

One popular approach to investigating the functioning of second language performance assessments is multifaceted Rasch analysis (MFRA), an extension of the basic Rasch model used with dichotomous data. The basic Rasch model estimates the probability of success on an item as a function of the …
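As general background for the truncated passage above (the formulas below are a standard textbook formulation, not taken from this report), the dichotomous Rasch model expresses the probability that examinee $n$ succeeds on item $i$ as a function of the difference between examinee ability $B_n$ and item difficulty $D_i$:

$$P(X_{ni} = 1) = \frac{e^{\,B_n - D_i}}{1 + e^{\,B_n - D_i}}$$

The many-facet extension used in MFRA adds further facets to the same logit scale; in a common rating-scale form, the log-odds of an examinee receiving category $k$ rather than $k-1$ from rater $j$ on task $i$ is modeled as

$$\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k$$

where $C_j$ is the severity of rater $j$ and $F_k$ is the threshold of rating category $k$ relative to category $k-1$. This is what allows MFRA to separate rater severity and rubric category functioning from examinee ability.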